Sequential minimal optimization (SMO) is an algorithm for solving the quadratic programming (QP) problem that arises during the training of support vector machines (SVMs). It was invented by John Platt in 1998 at Microsoft Research. SMO is widely used for training support vector machines and is implemented by the popular LIBSVM tool.〔Luca Zanni (2006). ''(Parallel Software for Training Large Scale Support Vector Machines on Multiprocessor Systems )''.〕 The publication of the SMO algorithm in 1998 generated a lot of excitement in the SVM community, as previously available methods for SVM training were much more complex and required expensive third-party QP solvers.

== Optimization problem ==
Consider a binary classification problem with a dataset (''x''<sub>1</sub>, ''y''<sub>1</sub>), ..., (''x''<sub>''n''</sub>, ''y''<sub>''n''</sub>), where ''x''<sub>''i''</sub> is an input vector and ''y''<sub>''i''</sub> ∈ is a binary label corresponding to it. A soft-margin support vector machine is trained by solving a quadratic programming problem, which is expressed in the dual form as follows:

:<math>\max_{\alpha} \sum_{i=1}^n \alpha_i - \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n y_i y_j K(x_i, x_j) \alpha_i \alpha_j,</math>
:subject to:
:<math>0 \leq \alpha_i \leq C, \quad \text{for } i = 1, 2, \ldots, n,</math>
:<math>\sum_{i=1}^n y_i \alpha_i = 0,</math>

where ''C'' is an SVM hyperparameter and ''K''(''x''<sub>''i''</sub>, ''x''<sub>''j''</sub>) is the kernel function, both supplied by the user; and the variables <math>\alpha_i</math> are Lagrange multipliers.
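The dual problem above can be attacked by repeatedly optimizing just two multipliers at a time, which is the core idea of SMO. The following is a minimal sketch of such a pairwise update loop, assuming a linear kernel, a tiny toy dataset, and a simplified second-multiplier choice (random, rather than Platt's full heuristics); the function names `smo_train` and `predict` are illustrative, not part of any standard library.

```python
import random

def linear_kernel(a, b):
    return sum(p * q for p, q in zip(a, b))

def smo_train(X, y, C=1.0, tol=1e-4, max_passes=20, kernel=linear_kernel):
    n = len(X)
    alpha = [0.0] * n
    b = 0.0
    K = [[kernel(X[i], X[j]) for j in range(n)] for i in range(n)]

    def f(i):
        # Decision function evaluated at training point i.
        return sum(alpha[k] * y[k] * K[k][i] for k in range(n)) + b

    rng = random.Random(0)
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            E_i = f(i) - y[i]
            # Only update alpha_i if it violates the KKT conditions.
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = rng.choice([k for k in range(n) if k != i])
                E_j = f(j) - y[j]
                # Box constraints 0 <= alpha <= C give clipping bounds L, H.
                if y[i] != y[j]:
                    L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
                else:
                    L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
                if L == H:
                    continue
                eta = 2 * K[i][j] - K[i][i] - K[j][j]
                if eta >= 0:
                    continue
                a_i_old, a_j_old = alpha[i], alpha[j]
                alpha[j] = min(H, max(L, a_j_old - y[j] * (E_i - E_j) / eta))
                if abs(alpha[j] - a_j_old) < 1e-7:
                    continue
                # The equality constraint sum_k y_k alpha_k = 0 determines alpha_i.
                alpha[i] = a_i_old + y[i] * y[j] * (a_j_old - alpha[j])
                # Update the bias term b from the two candidate thresholds.
                b1 = b - E_i - y[i] * (alpha[i] - a_i_old) * K[i][i] \
                     - y[j] * (alpha[j] - a_j_old) * K[i][j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i_old) * K[i][j] \
                     - y[j] * (alpha[j] - a_j_old) * K[j][j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b

# Toy linearly separable data in two dimensions.
X = [(2.0, 2.0), (3.0, 3.0), (-2.0, -2.0), (-3.0, -1.0)]
y = [1, 1, -1, -1]
alpha, b = smo_train(X, y)

def predict(x):
    s = sum(alpha[k] * y[k] * linear_kernel(X[k], x) for k in range(len(X))) + b
    return 1 if s >= 0 else -1
```

Note that each pairwise update keeps the linear constraint <math>\sum_i y_i \alpha_i = 0</math> satisfied exactly, which is why optimizing a single multiplier at a time is impossible and two is the minimum.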